Large-Scale Heteroscedastic Regression via Gaussian Process

Authors

Haitao Liu, Yew-Soon Ong, Jianfei Cai

Abstract

Heteroscedastic regression, which considers the varying noise among observations, has many applications in fields such as machine learning and statistics. Here, we focus on the heteroscedastic Gaussian process (HGP), which integrates the latent function and the noise in a unified nonparametric Bayesian framework. Though showing remarkable performance, HGP suffers from cubic time complexity, which strictly limits its application to big data. To improve scalability, we first develop a variational sparse inference algorithm, named VSHGP, to handle large-scale data sets. Furthermore, two variants are developed to improve the scalability and capability of VSHGP. The first is stochastic VSHGP (SVSHGP), which derives a factorized evidence lower bound, thus enabling efficient stochastic variational inference. The second is distributed VSHGP (DVSHGP), which follows the Bayesian committee machine formalism to distribute computations over multiple local VSHGP experts with many inducing points, and adopts hybrid parameters for the experts to guard against overfitting and to capture local variety. The superiority of DVSHGP and SVSHGP over existing scalable HGP/homoscedastic GP models is then extensively verified on various data sets.
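The core modeling idea is a GP prior on the latent function f plus a second latent log-noise function g, so the observation variance exp(g(x)) varies with the input. The sketch below illustrates that model with exact inference and a fixed log-noise function; it is not the paper's VSHGP/SVSHGP/DVSHGP machinery, and all names (rbf, hgp_predict, the toy g) are ours.

```python
import numpy as np

def rbf(A, B, ls=1.0, var=1.0):
    """Squared-exponential kernel k(a, b) = var * exp(-|a-b|^2 / (2 ls^2))."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ls ** 2)

def hgp_predict(X, y, Xs, log_noise):
    """Exact GP predictive mean/variance with input-dependent noise.

    Instead of a constant sigma^2 * I, the noise covariance is the diagonal
    matrix R = diag(exp(g(x_i))), where g is the (here fixed) log-noise
    function -- the quantity an HGP additionally infers.
    """
    R = np.diag(np.exp(log_noise(X)))            # heteroscedastic noise
    K = rbf(X, X) + R
    Ks = rbf(Xs, X)
    L = np.linalg.cholesky(K)                    # exact O(n^3) inference
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = rbf(Xs, Xs).diagonal() - (v ** 2).sum(axis=0) + np.exp(log_noise(Xs))
    return mean, var

# Toy data: noise grows with |x|, which a homoscedastic GP cannot express.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 100)
g = lambda x: -3.0 + 0.8 * np.abs(x)             # assumed log-noise function
y = np.sin(X) + rng.normal(0, np.exp(0.5 * g(X)))
mu, s2 = hgp_predict(X, y, np.linspace(-3, 3, 50), g)
```

VSHGP replaces the exact Cholesky solve above with a variational inducing-point approximation, reducing the cost to roughly O(nm^2) for m ≪ n inducing points.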


Similar Articles

Variational Heteroscedastic Gaussian Process Regression

Standard Gaussian processes (GPs) model observation noise as constant throughout the input space. This is often too restrictive an assumption, but one that is needed for GP inference to be tractable. In this work we present a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise conditions). Computational cost is roughly twic...
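One concrete consequence of this model is that the predictive variance combines the latent-function uncertainty with the mean of a log-normal noise term. A minimal worked sketch of those moments, with posterior values invented purely for illustration:

```python
import numpy as np

# At a test point, suppose the variational posteriors are
#   f* ~ N(a, c)     (latent function)  and
#   g* ~ N(m, s2)    (log noise variance),
# with y* = f* + eps,  eps | g* ~ N(0, exp(g*)).
a, c = 0.3, 0.05       # assumed posterior moments of f*
m, s2 = -2.0, 0.4      # assumed posterior moments of g*

mean_y = a                          # E[y*]
var_y = c + np.exp(m + s2 / 2.0)    # law of total variance; exp(m + s2/2)
                                    # is the mean of the log-normal exp(g*)
```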


Large-Scale Gaussian Process Regression via Doubly Stochastic Gradient Descent

Gaussian process regression (GPR) is a popular tool for nonlinear function approximation. Unfortunately, GPR can be difficult to use in practice due to the O(n^2) memory and O(n^3) processing requirements for n training data points. We propose a novel approach to scaling up GPR to handle large datasets using the recent concept of doubly stochastic functional gradients. Our approach relies on the fa...
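The abstract is cut off, but the doubly stochastic recipe it refers to is standard: every step samples both a random minibatch and a fresh random Fourier feature of the kernel, and the function estimate is a growing weighted sum of those features. A generic sketch under these assumptions (not the paper's exact algorithm; all names and constants are ours):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, T, eta, lam = 10_000, 1, 300, 0.5, 1e-3

X = rng.uniform(-3, 3, (n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# f(x) = sum_t alpha_t * phi_t(x), built one random feature per step.
W, b, alpha = [], [], []

def feat(Xq, w, c):
    # One random Fourier feature of the RBF kernel.
    return np.sqrt(2.0) * np.cos(Xq @ w + c)

def predict(Xq):
    f = np.zeros(len(Xq))
    for w, c, a in zip(W, b, alpha):
        f += a * feat(Xq, w, c)
    return f

for t in range(T):
    idx = rng.integers(0, n, 64)                    # randomness 1: minibatch
    w_t = rng.normal(size=d)                        # randomness 2: feature
    c_t = rng.uniform(0.0, 2.0 * np.pi)
    resid = predict(X[idx]) - y[idx]
    grad = np.mean(resid * feat(X[idx], w_t, c_t))  # functional gradient coeff.
    alpha = [(1.0 - eta * lam) * a for a in alpha]  # ridge shrinkage of old terms
    W.append(w_t); b.append(c_t); alpha.append(-eta * grad)
```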


Gaussian Process Regression with Heteroscedastic or Non-Gaussian Residuals

Gaussian Process (GP) regression models typically assume that residuals are Gaussian and have the same variance for all observations. However, applications with input-dependent noise (heteroscedastic residuals) frequently arise in practice, as do applications in which the residuals do not have a Gaussian distribution. In this paper, we propose a GP Regression model with a latent variab...


Greedy Block Coordinate Descent for Large Scale Gaussian Process Regression

We propose a variable decomposition algorithm, greedy block coordinate descent (GBCD), to make dense Gaussian process regression practical for large-scale problems. GBCD breaks a large-scale optimization into a series of small sub-problems. The challenge in variable decomposition algorithms is the identification of a sub-problem (the active set of variables) that yields the largest impro...
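In GP regression the dominant dense optimization is the training system (K + sigma^2 I) alpha = y, i.e. minimizing the quadratic 0.5 * alpha' (K + sigma^2 I) alpha - y' alpha, so a block coordinate sketch is easy to state. The greedy rule below (largest-gradient coordinates) is a simple stand-in for the paper's selection criterion, which the truncation hides; all names are ours:

```python
import numpy as np

def gbcd_solve(K, y, sigma2=1e-2, block=256, iters=50):
    """Greedy block coordinate descent for (K + sigma2*I) alpha = y."""
    n = len(y)
    A = K + sigma2 * np.eye(n)
    alpha = np.zeros(n)
    r = A @ alpha - y                        # gradient of the quadratic
    for _ in range(iters):
        S = np.argsort(-np.abs(r))[:block]   # greedy active set
        delta = np.linalg.solve(A[np.ix_(S, S)], -r[S])  # exact sub-problem
        alpha[S] += delta
        r += A[:, S] @ delta                 # cheap gradient refresh
    return alpha
```

Given the solution, the GP predictive mean at test inputs follows as k(X*, X) @ alpha.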


Patchwork Kriging for Large-scale Gaussian Process Regression

This paper presents a new approach for Gaussian process (GP) regression for large datasets. The approach involves partitioning the regression input domain into multiple local regions with a different local GP model fitted in each region. Unlike existing local partitioned GP approaches, we introduce a technique for patching together the local GP models nearly seamlessly to ensure that the local ...
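As background for the patching idea, here is the plain local-GP baseline it improves on: partition the input domain, fit an independent exact GP per region, and route each query to its region. Patchwork kriging additionally constrains neighboring models to agree on region boundaries; that step is omitted here, and all names are ours:

```python
import numpy as np

def rbf(A, B, ls=0.5):
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ls ** 2)

def fit_local_gps(X, y, edges, sigma2=1e-2):
    """One independent exact GP per interval [edges[i], edges[i+1])."""
    models = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (X >= lo) & (X < hi)
        Kinv_y = np.linalg.solve(
            rbf(X[m], X[m]) + sigma2 * np.eye(int(m.sum())), y[m])
        models.append((X[m], Kinv_y, lo, hi))
    return models

def predict(models, xq):
    for Xm, Kinv_y, lo, hi in models:        # route query to its region
        if lo <= xq < hi:
            return float(rbf(np.array([xq]), Xm) @ Kinv_y)
    raise ValueError("query outside the partitioned domain")

# Toy usage: 500 points on [0, 4), four local regions.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 4, 500))
y = np.sin(3 * X) + 0.1 * rng.normal(size=500)
models = fit_local_gps(X, y, edges=np.linspace(0, 4, 5))
print(predict(models, 1.7))
```

Without the patching step, predictions from this baseline are generally discontinuous at region boundaries, which is exactly the defect the paper addresses.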



Journal

Journal title: IEEE Transactions on Neural Networks and Learning Systems

Year: 2021

ISSN: 2162-237X, 2162-2388

DOI: https://doi.org/10.1109/tnnls.2020.2979188